248 research outputs found

    Effect of Range Condition on Steer Gains

    Range is a vast resource that makes up about 43% of the world's land surface, about 50% of the contiguous United States, and slightly over 51% of South Dakota. This land is generally unsuited for cultivation due to low precipitation, shallow soils, poor drainage, or temperature restrictions, but is well adapted to grazing.

    Computable bounds in fork-join queueing systems

    In a fork-join (FJ) queueing system, an upstream fork station splits incoming jobs into N tasks to be further processed by N parallel servers, each with its own queue; the response time of one job is determined, at a downstream join station, by the maximum of the corresponding tasks' response times. This queueing system is useful for the modelling of multi-service systems subject to synchronization constraints, such as MapReduce clusters or multipath routing. Despite their apparent simplicity, FJ systems are hard to analyze. This paper provides the first computable stochastic bounds on the waiting and response time distributions in FJ systems. We consider four practical scenarios by combining (1a) renewal and (1b) non-renewal arrivals with (2a) non-blocking and (2b) blocking servers. In the case of non-blocking servers, we prove that delays scale as O(log N), a law previously known only for first moments under renewal input. In the case of blocking servers, we prove that the same log N factor dictates the stability region of the system. Simulation results indicate that our bounds are tight, especially at high utilizations, in all four scenarios. A remarkable insight gained from our results is that, at moderate to high utilizations, multipath routing 'makes sense' from a queueing perspective for two paths only, i.e., response times drop the most when N = 2; the technical explanation is that the resequencing (delay) price quickly starts to dominate the tempting gain from multipath transmission.
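
The fork-join mechanics described above can be illustrated with a minimal discrete-event sketch: each job arrives to all N queues at once, and its response time is the maximum over its N tasks. This is an illustrative simulation with assumed Poisson arrivals and exponential service (rates chosen arbitrarily), not the paper's bounding analysis; it only shows the qualitative growth of mean response time with N.

```python
import random

def fj_response_times(n_tasks, n_jobs=20000, arrival_rate=0.5,
                      service_rate=1.0, seed=42):
    """Simulate a fork-join system with n_tasks parallel single-server
    FIFO queues (non-blocking servers). Each arriving job forks into
    n_tasks tasks; its response time is the max of the tasks' response
    times, taken at the join station. Returns the mean job response time."""
    rng = random.Random(seed)
    free_at = [0.0] * n_tasks   # time at which each server next becomes free
    t = 0.0
    total = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(arrival_rate)           # Poisson arrivals
        finish = []
        for s in range(n_tasks):
            start = max(t, free_at[s])               # wait in server s's queue
            free_at[s] = start + rng.expovariate(service_rate)
            finish.append(free_at[s])
        total += max(finish) - t                     # join: slowest task decides
    return total / n_jobs

# Mean response time grows slowly with N, consistent with O(log N) scaling.
for n in (1, 2, 4, 8):
    print(n, round(fj_response_times(n), 2))
```

At this moderate utilization the largest relative jump occurs between N = 1 and N = 2, after which the synchronization penalty grows only logarithmically.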

    Hormonal Signal Amplification Mediates Environmental Conditions during Development and Controls an Irreversible Commitment to Adulthood

    Many animals can choose between different developmental fates to maximize fitness. Despite the complexity of environmental cues and life history, different developmental fates are executed in a robust fashion. The nematode Caenorhabditis elegans serves as a powerful model to examine this phenomenon because it can adopt one of two developmental fates (adulthood or diapause) depending on environmental conditions. The steroid hormone dafachronic acid (DA) directs development to adulthood by regulating the transcriptional activity of the nuclear hormone receptor DAF-12. The known role of DA suggests that it may be the molecular mediator of environmental condition effects on the developmental fate decision, although the mechanism has remained unknown. We used a combination of physiological and molecular biology techniques to demonstrate that commitment to reproductive adult development occurs when DA levels, produced in the neuroendocrine XXX cells, exceed a threshold. Furthermore, imaging and cell ablation experiments demonstrate that the XXX cells act as a source of DA, which, upon commitment to adult development, is amplified and propagated in the epidermis in a DAF-12-dependent manner. This positive feedback loop increases DA levels and drives adult programs in the gonad and epidermis, thus conferring the irreversibility of the decision. We show that the positive feedback loop canalizes development by ensuring that sufficient amounts of DA are dispersed throughout the body, and serves as a robust fate-locking mechanism to enforce an organism-wide binary decision despite noisy and complex environmental cues. These mechanisms are not only relevant to C. elegans but may be extended to other hormone-based decision-making mechanisms in insects and mammals.

    Exploring pig trade patterns to inform the design of risk-based disease surveillance and control strategies

    An understanding of the patterns of animal contact networks provides essential information for the design of risk-based animal disease surveillance and control strategies. This study characterises pig movements throughout England and Wales between 2009 and 2013, with a view to describing spatial and temporal patterns, network topology and trade communities. Data were extracted from the Animal and Plant Health Agency (APHA)'s RADAR (Rapid Analysis and Detection of Animal-related Risks) database, and analysed using descriptive and network approaches. A total of 61,937,855 pigs were moved through 872,493 batch movements in England and Wales during the 5-year study period. Results show that the network exhibited scale-free and small-world topologies, indicating the potential for diseases to spread quickly within the pig industry. The findings also provide suggestions for how risk-based surveillance strategies could be optimised in the country by taking account of highly connected holdings, geographical regions and time periods with the greatest number of movements and pigs moved, as these are likely to be at higher risk of disease introduction. This study is also the first attempt to identify trade communities in the country, information which could be used to facilitate the pig trade and maintain disease-free status across the country in the event of an outbreak.
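
The "scale-free" property reported above arises from preferential attachment: holdings that already trade a lot attract further trade. A minimal stdlib sketch of that mechanism, on synthetic data rather than the RADAR records, shows why a few hub holdings dominate and are natural targets for risk-based surveillance.

```python
import random
from collections import Counter

def preferential_attachment(n_nodes, m=2, seed=0):
    """Grow an undirected graph where each new node links to m existing
    nodes chosen (approximately) proportionally to their degree, i.e.
    Barabasi-Albert-style growth: the standard generative mechanism for
    heavy-tailed ("scale-free") degree distributions."""
    rng = random.Random(seed)
    targets = list(range(m))   # the first new node links to the seed nodes
    weighted = []              # node list, approximately degree-weighted
    edges = []
    for new in range(m, n_nodes):
        for t in set(targets):
            edges.append((new, t))
        weighted.extend(targets)
        weighted.extend([new] * m)
        targets = [rng.choice(weighted) for _ in range(m)]
    return edges

edges = preferential_attachment(2000)
deg = Counter()
for u, v in edges:
    deg[u] += 1
    deg[v] += 1

# A handful of "holdings" accumulate far more contacts than the median --
# in a real movement network these hubs would be prioritised for surveillance.
degrees = sorted(deg.values(), reverse=True)
print("max degree:", degrees[0], "median degree:", degrees[len(degrees) // 2])
```

On real movement data the same idea applies: rank holdings by (in- or out-) degree and concentrate surveillance resources on the top of that ranking.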

    Coupling models of cattle and farms with models of badgers for predicting the dynamics of bovine tuberculosis (TB)

    Bovine TB is a major problem for the agricultural industry in several countries. TB can be contracted and spread by species other than cattle, and this can cause a problem for disease control. In the UK and Ireland, badgers are a recognised reservoir of infection and there has been substantial discussion about potential control strategies. We present a coupling of individual-based models of bovine TB in badgers and cattle, which aims to capture the key details of the natural history of the disease and of both species at approximately county scale. The model is spatially explicit: it follows a very large number of cattle and badgers on a different grid size for each species, and also includes winter housing. We show that the model can replicate the reported dynamics of both cattle and badger populations as well as the increasing prevalence of the disease in cattle. The parameter space used as input to simulations was swept out using Latin hypercube sampling, and sensitivity analysis of model outputs was conducted using mixed-effects models. By exploring a large and computationally intensive parameter space, we show that of the available control strategies it is the frequency of TB testing and whether or not winter housing is practised that have the most significant effects on the number of infected cattle, with the effect of winter housing becoming stronger as farm size increases. Whether badgers were culled or not explained about 5% of the variance in the number of infected cattle, while the accuracy of the test employed to detect infected cattle explained less than 3%.
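
Latin hypercube sampling, used above to sweep the parameter space, divides each parameter's range into as many equal strata as there are samples and uses every stratum exactly once, giving much better coverage than plain random draws. A minimal stdlib sketch follows; the parameter names and ranges are illustrative stand-ins, not the model's actual inputs.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Return n_samples parameter dicts forming a Latin hypercube design.
    `bounds` maps parameter name -> (low, high). For each parameter the
    range is split into n_samples equal strata; each sample occupies a
    distinct stratum, with strata shuffled independently per parameter
    to decorrelate the dimensions."""
    rng = random.Random(seed)
    samples = [dict() for _ in range(n_samples)]
    for name, (lo, hi) in bounds.items():
        strata = list(range(n_samples))
        rng.shuffle(strata)                     # independent order per parameter
        width = (hi - lo) / n_samples
        for i, s in enumerate(strata):
            samples[i][name] = lo + (s + rng.random()) * width  # jitter in stratum
    return samples

# Hypothetical sweep: testing interval (days) and a transmission rate.
design = latin_hypercube(10, {"test_interval": (90, 365),
                              "transmission_rate": (0.001, 0.01)})
for point in design[:3]:
    print(point)
```

Each of the ten design points would then be run through the coupled badger-cattle model, and the outputs regressed against the inputs for the sensitivity analysis.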

    Balancing Detection and Eradication for Control of Epidemics: Sudden Oak Death in Mixed-Species Stands

    Culling of infected individuals is a widely used measure for the control of several plant and animal pathogens, but culling first requires detection of often cryptically infected hosts. In this paper, we address the problem of how to allocate resources between detection and culling when the budget for disease management is limited. The results are generic, but we motivate the problem with the control of a botanical epidemic in a natural ecosystem: sudden oak death in mixed evergreen forests in coastal California, in which species composition is generally dominated by a spreader species (bay laurel) and a second host species (coast live oak) that is an epidemiological dead-end, in that it does not transmit infection, but which is frequently a target for preservation. Using an epidemiological model for two host species with a common pathogen together with optimal control theory, we address the problem of how to balance the allocation of resources between detection and epidemic control in order to preserve both host species in the ecosystem. Contrary to simple expectations, our results show that an intermediate level of detection is optimal. Low levels of detection, characteristic of little effort expended on searching for and detecting diseased trees, and high detection levels, exemplified by the deployment of large amounts of resources to identify diseased trees, both fail to bring the epidemic under control. Importantly, we show that a slight change in the balance between the resources allocated to detection and those allocated to control may lead to drastic inefficiencies in control strategies. The results hold when quarantine is introduced to reduce the ingress of infected material into the region of interest.
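
The intuition behind "intermediate detection is optimal" can be shown with a deliberately trivial stand-in for the paper's optimal-control model: if removals require both finding infected hosts (detection spend) and culling what is found (eradication spend), the removal rate is roughly a product of the two, and either extreme of the budget split wastes money. This toy is an assumption-laden caricature, not the authors' model.

```python
def removal_rate(detect_frac, budget=1.0):
    """Toy detection/eradication trade-off under a fixed budget.
    A fraction detect_frac of the budget buys detection (finding
    cryptically infected trees); the remainder buys culling capacity.
    Removals need both, so the rate is their product -- zero at
    either extreme. A caricature, not the paper's control model."""
    detection = detect_frac * budget
    culling = (1.0 - detect_frac) * budget
    return detection * culling

rates = {f: removal_rate(f) for f in (0.0, 0.25, 0.5, 0.75, 1.0)}
best = max(rates, key=rates.get)
print(best, rates[best])
```

The full optimal-control analysis in the paper replaces this product with epidemic dynamics for the two host species, but the qualitative conclusion (an interior optimum, sensitive to the exact split) is the same.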

    Statistical modeling of holding level susceptibility to infection during the 2001 foot and mouth disease epidemic in Great Britain

    Background: An understanding of the factors that determine the risk of members of a susceptible population becoming infected is essential for estimating the potential for disease spread, as opposed to focusing only on transmission from an infected population. Furthermore, analysis of the risk factors can reveal important characteristics of an epidemic and further develop understanding of the processes operating.
    Methods: This paper describes the development of a mixed-effects logistic regression model of the susceptibility of holdings to foot and mouth disease (FMD) during the 2001 epidemic in Great Britain, following the imposition of a national ban on movements of susceptible animals (NMB).
    Results: The principal risk factors identified in the model were shorter distance to the nearest infectious seed (a holding infected before the NMB) and the county of the holding (principally Cumbria). Additional risk factors included mixed-species rather than single-species holdings, the surface area of the holding, and the number of cattle within 10 km (all p < 0.001), but not surrounding sheep densities (p > 0.1). The fit of the model was evaluated using the area under the receiver operating characteristic (ROC) curve and the Hosmer-Lemeshow chi-squared statistic; the fit was good by both tests (area under the ROC curve = 0.962; Hosmer-Lemeshow chi-squared = 49.98, p > 0.1).
    Conclusions: Holdings at greatest risk of infection can be identified using simple, readily available risk factors; this information could be employed in the control of future FMD epidemics.
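
The area under the ROC curve used above to assess model fit has a simple rank interpretation: it is the probability that a randomly chosen infected holding receives a higher predicted risk than a randomly chosen uninfected one. A minimal stdlib implementation, on made-up labels and scores (the real model's covariates and coefficients are not reproduced here):

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    identity: the fraction of (positive, negative) pairs where the
    positive scores higher, counting ties as half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical data: 1 = holding infected, scores = model-predicted risk
# (in the paper, risk is driven by e.g. distance to the nearest seed).
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]
print(roc_auc(labels, scores))   # 11 of 12 pairs correctly ordered
```

A value near 1 indicates the model ranks infected above uninfected holdings almost perfectly, which is what the reported 0.962 conveys.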